
    The cosmological constant and the relaxed universe

    We study the role of the cosmological constant (CC) as a component of dark energy (DE). It is argued that the cosmological term is in general unavoidable and should not be ignored even when dynamical DE sources are considered. From the theoretical point of view, quantum zero-point energy and phase transitions suggest a CC of large magnitude, in contrast to its tiny observed value. Simply relieving this discord with a counterterm requires extreme fine-tuning, which is referred to as the old CC problem. To avoid it, we discuss some recent approaches for neutralising a large CC dynamically without adding a fine-tuned counterterm. This can be realised by an effective DE component which relaxes the cosmic expansion by counteracting the effect of the large CC. Alternatively, a CC filter is constructed by modifying gravity to make it insensitive to vacuum energy. Comment: 6 pages, no figures, based on a talk presented at PASCOS 201
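
    To make the fine-tuning explicit, here is the standard back-of-the-envelope version of the old CC problem (a textbook estimate, not taken from the abstract): the observable vacuum energy density is the sum of induced contributions and a counterterm,

        \rho_\Lambda^{\rm eff} = \rho_{\rm ind} + \rho_{\rm ct}, \qquad
        |\rho_{\rm ind}| \gtrsim \mathcal{O}(v^4) \sim 10^{8}\,{\rm GeV}^4, \qquad
        \rho_\Lambda^{\rm obs} \sim 10^{-47}\,{\rm GeV}^4 ,

    so even the electroweak contribution alone (v ~ 246 GeV) forces the counterterm to cancel the induced part to roughly 55 decimal places; Planck-scale zero-point energy would require about 120.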

    Relaxing a large cosmological constant in the astrophysical domain

    We study the problem of relaxing a large cosmological constant in the astrophysical domain through a dynamical mechanism based on a modified action of gravity previously considered by us at the cosmological level. We solve the model in the Schwarzschild-de Sitter metric for large and small astrophysical scales, and address its physical interpretation by separately studying its Jordan-frame and Einstein-frame formulations. In particular, we determine the extremely weak strength of fifth forces in our model and show that they are virtually unobservable. Finally, we estimate the influence that the relaxation mechanism may have on pulling apart the values of the two gravitational potentials Psi and Phi of the metric, as this implies a departure of the model from General Relativity and could eventually provide an observational test of the new framework at large astrophysical scales, e.g. through gravitational lensing. Comment: 14 pages, 3 figures, accepted in Mod. Phys. Lett. A, extended discussion, references added
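
    For reference, the Schwarzschild-de Sitter line element solved for above has the standard textbook form (not quoted from the paper):

        ds^2 = -\Big(1 - \frac{2GM}{r} - \frac{\Lambda r^2}{3}\Big)\, dt^2
               + \Big(1 - \frac{2GM}{r} - \frac{\Lambda r^2}{3}\Big)^{-1} dr^2 + r^2\, d\Omega^2 .

    In General Relativity without anisotropic stress the two weak-field potentials coincide, Psi = Phi, which is why a measured difference between them, e.g. from lensing, would be a clean observational signature of the modified-gravity framework discussed above.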

    Linking gene regulation and the exo-metabolome: A comparative transcriptomics approach to identify genes that impact on the production of volatile aroma compounds in yeast

    BACKGROUND: 'Omics' tools provide novel opportunities for system-wide analysis of complex cellular functions. Secondary metabolism is an example of a complex network of biochemical pathways which, although well mapped from a biochemical point of view, is not well understood with regard to its physiological roles and genetic and biochemical regulation. Many of the metabolites produced by this network, such as higher alcohols and esters, are significant aroma-impact compounds in fermentation products, and different yeast strains are known to produce highly divergent aroma profiles. Here, we investigated whether we can predict the impact of specific genes of known or unknown function on this metabolic network by combining whole-transcriptome and partial exo-metabolome analysis. RESULTS: For this purpose, the gene expression levels of five different industrial wine yeast strains that produce divergent aroma profiles were established at three different time points of alcoholic fermentation in synthetic wine must. A matrix of gene expression data was generated and integrated with the concentrations of volatile aroma compounds measured at the same time points. This relatively unbiased approach to the study of volatile aroma compounds enabled us to identify candidate genes for aroma profile modification. Five of these genes, namely YMR210W, BAT1, AAD10, AAD14 and ACS1, were selected for overexpression in the commercial wine yeast VIN13. Analysis of the data shows a statistically significant correlation between the changes in the exo-metabolome of the overexpressing strains and the changes that were predicted based on the unbiased alignment of transcriptomic and exo-metabolomic data. CONCLUSION: The data suggest that a comparative transcriptomics and metabolomics approach can be used to identify the metabolic impacts of the expression of individual genes in complex systems, and demonstrate the amenability of transcriptomic data to direct applications of biotechnological relevance.
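
    A minimal sketch of the kind of transcriptome/exo-metabolome integration described above, in Python: correlate a genes-by-samples expression matrix with a compounds-by-samples metabolite matrix and rank candidate genes. The matrix sizes, random placeholder data and the plain Pearson-correlation ranking are illustrative assumptions, not the authors' actual pipeline.

        import numpy as np

        # placeholder data: 5 strains x 3 time points = 15 shared samples
        rng = np.random.default_rng(0)
        expression = rng.normal(size=(6000, 15))   # genes x samples (illustrative)
        metabolites = rng.normal(size=(20, 15))    # aroma compounds x samples (illustrative)

        def zscore(m):
            # z-score each row so Pearson correlation reduces to a scaled dot product
            return (m - m.mean(axis=1, keepdims=True)) / m.std(axis=1, keepdims=True)

        # genes x compounds matrix of Pearson correlations across the shared samples
        corr = zscore(expression) @ zscore(metabolites).T / expression.shape[1]

        # rank genes by their strongest (absolute) association with any compound
        candidates = np.argsort(-np.abs(corr).max(axis=1))[:5]
        print("candidate gene indices:", candidates)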

    Perturbations in the relaxation mechanism for a large cosmological constant

    Recently, a mechanism for relaxing a large cosmological constant (CC) has been proposed [arXiv:0902.2215], which permits solutions with low Hubble rates at late times without fine-tuning. The setup is implemented in the LXCDM framework, and we found a reasonable cosmological background evolution similar to that of the LCDM model with a fine-tuned CC. In this work we analyse analytically the perturbations in this relaxation model, and we show that their evolution is also similar to that of the LCDM model, especially in the matter era. Some tracking properties of the vacuum energy are discussed as well. Comment: 18 pages, LaTeX; discussion improved, accepted by CQG
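
    For orientation, the benchmark for the matter-era behaviour mentioned above is the standard subhorizon growth equation for matter density fluctuations (a textbook result, not the model-specific perturbation system derived in the paper):

        \ddot{\delta}_m + 2H\dot{\delta}_m - 4\pi G \rho_m \delta_m = 0 ,

    whose growing mode in the matter era is \delta_m \propto a; the claim in the abstract is that the relaxation model reproduces this LCDM-like growth despite the large CC in the action.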

    Physical modeling of echelle spectrographs: the CARMENES case study

    We have developed a generic physical modeling scheme for high-resolution spectroscopy based on simple optical principles. This model predicts the positions of the centroids of a given set of spectral features with high accuracy. It considers off-plane grating equations and rotations of the different optical elements in order to properly account for tilts in the spectral lines and order curvature. In this way any astronomical spectrograph can be modeled and controlled without the need for commercial ray-tracing software. The computations are based on direct ray tracing, applying exact corrections to certain surface types. This allows us to compute the position on the detector of any spectral feature with high reliability. The parameters of this model, which describe the physical properties of the spectrograph, are continuously optimized to ensure the best possible fit to the observed spectral line positions. We present the physical modeling of CARMENES as a case study. We show that our results are in agreement with commercial ray-tracing software. The model prediction matches the observations at the pixel level, providing an efficient tool in the design, construction and data reduction of high-resolution spectrographs. © 2018 SPIE
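
    A minimal Python sketch of the core ingredient named above, the off-plane (conical) grating equation m*lambda = d*cos(gamma)*(sin(alpha) + sin(beta)); the numerical values are generic echelle numbers, not CARMENES parameters:

        import numpy as np

        def diffraction_angle(wavelength_nm, order, grooves_per_mm, alpha_deg, gamma_deg):
            """Solve the off-plane grating equation for the diffracted angle beta (deg)."""
            d_nm = 1e6 / grooves_per_mm                 # groove spacing in nm
            alpha, gamma = np.radians([alpha_deg, gamma_deg])
            sin_beta = order * wavelength_nm / (d_nm * np.cos(gamma)) - np.sin(alpha)
            return np.degrees(np.arcsin(sin_beta))

        # illustrative echelle configuration (assumed values)
        beta = diffraction_angle(wavelength_nm=1000.0, order=50, grooves_per_mm=31.6,
                                 alpha_deg=75.0, gamma_deg=2.0)
        print(f"diffracted angle: {beta:.3f} deg")

    Mapping beta through the camera geometry then yields the predicted centroid position on the detector, and the model parameters (angles, spacings, rotations) are the quantities optimized against the observed line positions.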

    Investigation of olfactory interactions of low levels of five off-flavour causing compounds in a red wine matrix

    CITATION: McKay, M. et al. 2020. Investigation of olfactory interactions of low levels of five off-flavour causing compounds in a red wine matrix. Food Research International, 128. doi:10.1016/j.foodres.2019.108878. The original publication is available at https://www.sciencedirect.com/journal/food-research-international. The qualitative sensory perception of individual and of complex mixtures of five compounds, guaiacol (‘burnt note’), o-cresol (‘phenolic/tar’), 4-ethylphenol (4-EP, ‘leather/barnyard’), 2-iso-butyl-3-methoxypyrazine (IBMP, ‘green pepper/herbaceous’), and 2,4,6-trichloroanisole (TCA, ‘cork taint/mouldy’), was tested in a partially de-aromatised red wine matrix using descriptive analysis by a trained panel of eleven judges. Compounds were characterised at peri- and sub-threshold concentrations using a partial D-optimal statistical design and response surface methodology. Results indicated that complex mixtures in red wine elicit an olfactory response that could not be predicted from the attributes or descriptors of single compounds. Positive sweet/fruity attributes were more intense in solutions containing fewer off-flavour compounds. Novel findings of this study include that IBMP at sub- and peri-threshold levels shows perceptual interaction with volatile phenols at the same levels, and samples containing combinations of these compounds manifested herbaceous and burnt characteristics. Olfactory interactions of this many off-flavour compounds have not previously been investigated in a single study. The findings have direct implications for wines made from cultivars that are known to contain these compounds, and add to the understanding of the behaviour and impact of very low levels (peri- and sub-threshold) of volatile phenols, IBMP, and TCA derived from various sources during winemaking. Publisher's version: https://www.sciencedirect.com/science/article/pii/S0963996919307641?via%3Dihub
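
    A minimal sketch of the response-surface step described above: fit a second-order surface to panel intensity scores as a function of two compound concentrations. Data, design points and attribute are placeholders; the study's partial D-optimal design spans five compounds.

        import numpy as np

        # placeholder design points (normalised concentrations of two compounds)
        conc = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0],
                         [0.5, 0.5], [0.5, 0.0], [0.0, 0.5]])
        score = np.array([1.0, 3.2, 2.8, 3.9, 3.1, 2.4, 2.2])  # mean panel scores (made up)

        x1, x2 = conc[:, 0], conc[:, 1]
        # quadratic response surface: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coef, *_ = np.linalg.lstsq(X, score, rcond=None)
        print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], coef.round(3))))

    A significant interaction coefficient (b12) is the kind of evidence behind the statement that mixtures elicit responses not predictable from single compounds.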

    Filtering out the cosmological constant in the Palatini formalism of modified gravity

    According to theoretical physics, the cosmological constant (CC) is expected to be much larger in magnitude than the other energy densities in the universe, which is in stark contrast to the observed Big Bang evolution. We address this old CC problem not by introducing an extremely fine-tuned counterterm, but in the context of modified gravity in the Palatini formalism. In our model the large CC term is filtered out, and it does not prevent a standard cosmological evolution. We discuss the filter effect in the epochs of radiation and matter domination as well as in the asymptotic de Sitter future. The final expansion rate can be much lower than that inferred from the large CC, without using a fine-tuned counterterm. Finally, we show that the CC filter also works in the Kottler (Schwarzschild-de Sitter) metric describing a black-hole environment with a CC compatible with the future de Sitter cosmos. Comment: 22 pages, 1 figure, discussion extended, references added, accepted by Gen. Rel. Grav.
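
    For context, in the Palatini formalism the metric and the connection are varied independently; for a generic f(R) Lagrangian this gives the standard field equations (generic textbook form, not the paper's specific filter model):

        f'(\mathcal{R})\,\mathcal{R}_{(\mu\nu)}(\Gamma) - \tfrac{1}{2} f(\mathcal{R})\, g_{\mu\nu} = 8\pi G\, T_{\mu\nu},
        \qquad f'(\mathcal{R})\,\mathcal{R} - 2 f(\mathcal{R}) = 8\pi G\, T .

    The trace equation is algebraic in \mathcal{R}, so the curvature responds to the matter content very differently than in metric gravity; this structural feature is what a Palatini model can exploit to filter out a large vacuum term.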

    A Formalism for the Systematic Treatment of Rapidity Logarithms in Quantum Field Theory

    Many observables in QCD rely upon the resummation of perturbation theory to retain predictive power. Resummation follows after one factorizes the cross section into the relevant modes. The class of observables which are sensitive to soft recoil effects is particularly challenging to factorize and resum, since they involve rapidity logarithms. In this paper we present a formalism which allows one to factorize and resum the perturbative series for such observables in a systematic fashion through the notion of a "rapidity renormalization group". That is, a Collins-Soper-like equation is realized as a renormalization group equation, but one with a more universal applicability to observables beyond the traditional transverse momentum dependent parton distribution functions (TMDPDFs) and the Sudakov form factor. This formalism has the feature that it allows one to track the (non-standard) scheme dependence which is inherent in any scenario where one performs a resummation of rapidity divergences. We present a pedagogical introduction to the formalism by applying it to the well-known massive Sudakov form factor. The formalism is then used to study observables of current interest. A factorization theorem for the transverse momentum distribution of Higgs production is presented, along with the result for the resummed cross section at NLL. Our formalism allows one to define gauge-invariant TMDPDFs which are independent of both the hard scattering amplitude and the soft function, i.e. they are universal. We present details of the factorization and resummation of the jet broadening cross section, including a renormalization in p_T space. We furthermore show how to regulate and renormalize exclusive processes which are plagued by endpoint singularities in such a way as to allow for a consistent resummation. Comment: Typos in Appendix C corrected, as well as a typo in eq. 5.6
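
    Schematically, the rapidity renormalization group described above supplements the usual \mu evolution with evolution in a rapidity scale \nu (generic notation for a soft or collinear function F; see the paper for the precise operator definitions):

        \nu \frac{d}{d\nu} F(\mu,\nu) = \gamma_\nu\, F(\mu,\nu), \qquad
        \mu \frac{d}{d\mu} F(\mu,\nu) = \gamma_\mu\, F(\mu,\nu), \qquad
        \mu \frac{d}{d\mu}\, \gamma_\nu = \nu \frac{d}{d\nu}\, \gamma_\mu ,

    where the last relation is the consistency condition following from the commutativity of the two derivatives. Resumming rapidity logarithms then amounts to running in \nu between the scales natural to the soft and collinear sectors, in direct analogy with ordinary RG running in \mu.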

    Association between infectious burden, socioeconomic status, and ischemic stroke

    Background and aims: Infectious diseases contribute to stroke risk and are associated with socioeconomic status (SES). We tested the hypotheses that the aggregate burden of infections increases the risk of ischemic stroke (IS) and partly explains the association between low SES and ischemic stroke. Methods: In a case-control study with 470 ischemic stroke patients and 809 age- and sex-matched controls, randomly selected from the population, antibodies against the periodontal microbial agents Aggregatibacter actinomycetemcomitans and Porphyromonas gingivalis, against Chlamydia pneumoniae and Mycoplasma pneumoniae (IgA and IgG), and against CagA-positive Helicobacter pylori (IgG) were assessed. Results: IgA seropositivity to two microbial agents was significantly associated with IS after adjustment for SES (OR 1.45, 95% CI 1.01-2.08), but not in the fully adjusted model (OR 1.32, 95% CI 0.86-2.02). By trend, cumulative IgA seropositivity was associated with stroke due to large vessel disease (LVD) after full adjustment (OR 1.88, 95% CI 0.96-3.69). Disadvantageous childhood SES was associated with higher cumulative seropositivity in univariable analyses; however, its strong impact on stroke risk was not influenced by seroepidemiological data in the multivariable model. The strong association between adulthood SES and stroke was rendered nonsignificant when factors of dental care were adjusted for. Conclusions: Infectious burden assessed with five microbial agents did not consistently contribute independently to ischemic stroke, but may contribute to stroke due to LVD. High infectious burden may not explain the association between childhood SES and stroke risk. Lifestyle factors that include dental negligence may contribute to the association between disadvantageous adulthood SES and stroke. (C) 2016 Elsevier Ireland Ltd. All rights reserved. Peer reviewed
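
    A minimal sketch of how odds ratios with 95% confidence intervals like those quoted above come out of a case-control logistic regression, using synthetic data and the statsmodels API (this is not the study's actual model, which adjusts for SES and further covariates):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 1279                               # 470 cases + 809 controls
        seropositive = rng.integers(0, 2, n)   # placeholder IgA seropositivity indicator
        age = rng.normal(65, 10, n)            # placeholder covariate
        logit_p = -0.5 + 0.35 * seropositive + 0.01 * (age - 65)
        stroke = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

        X = sm.add_constant(np.column_stack([seropositive, age]))
        fit = sm.Logit(stroke, X).fit(disp=0)
        odds_ratio = np.exp(fit.params[1])     # exponentiated coefficient = OR
        lo, hi = np.exp(fit.conf_int()[1])     # 95% CI on the OR scale
        print(f"OR {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")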

    Dynamically avoiding fine-tuning the cosmological constant: the "Relaxed Universe"

    We demonstrate that there exists a large class of action functionals of the scalar curvature and of the Gauss-Bonnet invariant which are able to relax dynamically a large cosmological constant (CC), whatever its starting value in the early universe. Hence, it is possible to understand, without fine-tuning, the very small current value of the CC as compared to its theoretically expected large value in quantum field theory and string theory. In our framework, this relaxation appears as a pure gravitational effect, where no ad hoc scalar fields are needed. The action involves a positive power of a characteristic mass parameter, M, whose value can be, interestingly enough, of the order of a typical particle-physics mass of the Standard Model of the strong and electroweak interactions or extensions thereof, including the neutrino mass. The model universe emerging from this scenario (the "Relaxed Universe") falls within the class of the so-called LXCDM models of cosmic evolution. Therefore, there is a "cosmon" entity X (represented by an effective object, not a field), which in this case is generated by the effective functional and is responsible for the dynamical adjustment of the cosmological constant. This model universe successfully mimics the essential past epochs of the standard (or "concordance") cosmological model (LCDM). Furthermore, it provides interesting clues to the coincidence problem and may even connect naturally with primordial inflation. Comment: LaTeX, 63 pp, 8 figures. Extended discussion. Version accepted in JCAP
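
    In the LXCDM language used above, the mechanism can be summarised at the level of the Friedmann equation, with the cosmon component X dynamically compensating the large CC (schematic, our notation):

        3H^2 = 8\pi G \big( \rho_m + \rho_r + \rho_\Lambda + \rho_X \big), \qquad
        \rho_X \simeq -\rho_\Lambda + \mathcal{O}(\rho_{\rm obs}) ,

    where \rho_X is generated by the curvature functional itself rather than tuned by hand, so the expansion rate stays close to the observed one whatever the initial \rho_\Lambda.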